A note on convergence rates of Gibbs sampling for nonparametric mixtures∗
Authors
Abstract
We present a mathematical analysis of a class of Gibbs sampler algorithms for nonparametric mixtures, which use Dirichlet process priors and have updating steps which are partially discrete and partially continuous. We prove that such Gibbs samplers are uniformly ergodic, and we give a quantitative bound on their convergence rate. In a special case we can give a much sharper quantitative bound; however, in general the problem of sharper quantitative bounds remains open.
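To make the class of samplers concrete, the sketch below shows a conjugate (marginal) Gibbs sampler for a Dirichlet process mixture of normals in the spirit of the Escobar–West / Neal conjugate algorithms: the reallocation of observations to clusters is the discrete part of the update, and the resampling of cluster means is the continuous part. The model, the hyperparameters (alpha, mu0, tau2, sigma2), and the function name gibbs_dp_normal are illustrative assumptions, not the specific sampler analysed in the paper.

```python
# Sketch of a conjugate (marginal) Gibbs sampler for a Dirichlet process
# mixture of normals.  Illustrative only: the hyperparameters and model
# are assumptions, not the paper's exact setting.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_dp_normal(y, n_iter=1000, alpha=1.0, mu0=0.0, tau2=4.0, sigma2=1.0):
    n = len(y)
    z = np.zeros(n, dtype=int)          # cluster labels (discrete component)
    theta = {0: y.mean()}               # cluster means  (continuous component)
    for _ in range(n_iter):
        # --- discrete update: reallocate each observation ---
        for i in range(n):
            zi = z[i]
            z[i] = -1                   # remove observation i from its cluster
            if not np.any(z == zi):
                theta.pop(zi)           # drop the cluster if it became empty
            labels = list(theta)
            counts = np.array([(z == k).sum() for k in labels], dtype=float)
            means = np.array([theta[k] for k in labels])
            # predictive weights: existing clusters vs. opening a new cluster
            lik_old = np.exp(-0.5 * (y[i] - means) ** 2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
            lik_new = np.exp(-0.5 * (y[i] - mu0) ** 2 / (sigma2 + tau2)) / np.sqrt(2 * np.pi * (sigma2 + tau2))
            w = np.append(counts * lik_old, alpha * lik_new)
            w /= w.sum()
            choice = rng.choice(len(w), p=w)
            if choice == len(labels):   # open a new cluster for observation i
                new_k = max(theta, default=-1) + 1
                post_var = 1.0 / (1.0 / tau2 + 1.0 / sigma2)
                post_mean = post_var * (mu0 / tau2 + y[i] / sigma2)
                theta[new_k] = rng.normal(post_mean, np.sqrt(post_var))
                z[i] = new_k
            else:
                z[i] = labels[choice]
        # --- continuous update: redraw each cluster mean from its conjugate posterior ---
        for k in list(theta):
            yk = y[z == k]
            post_var = 1.0 / (1.0 / tau2 + len(yk) / sigma2)
            post_mean = post_var * (mu0 / tau2 + yk.sum() / sigma2)
            theta[k] = rng.normal(post_mean, np.sqrt(post_var))
    return z, theta

# Example: two well-separated groups
y = np.concatenate([rng.normal(-3, 1, 50), rng.normal(3, 1, 50)])
z, theta = gibbs_dp_normal(y, n_iter=200)
print("clusters found:", len(theta))
```

The discrete step uses the Chinese-restaurant predictive weights obtained by integrating out the Dirichlet process, while the continuous step redraws each cluster mean from its conjugate normal posterior; it is samplers with this mixed discrete/continuous updating structure that the abstract's uniform-ergodicity statement refers to.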
Similar papers
Posterior Convergence Rates of Dirichlet Mixtures at Smooth Densities
We study the rates of convergence of the posterior distribution for Bayesian density estimation with Dirichlet mixtures of normal distributions as the prior. The true density is assumed to be twice continuously differentiable. The bandwidth is given a sequence of priors which is obtained by scaling a single prior by an appropriate order. In order to handle this problem, we derive a new general ...
Studying Convergence of Markov Chain Monte Carlo Algorithms Using Coupled Sample Paths
I describe a simple procedure for investigating the convergence properties of Markov Chain Monte Carlo sampling schemes. The procedure employs multiple runs from a sampler, using the same random deviates for each run. When the sample paths from all sequences converge, it is argued that approximate equilibrium conditions hold. The procedure also provides a simple diagnostic for detecting modes i...
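As a rough illustration of this coupled-paths idea, the sketch below runs a two-block Gibbs sampler for a bivariate normal target from several starting points while reusing a single stream of random deviates, and reports the iteration at which the paths agree. The target, the starting values, and the tolerance are illustrative assumptions, not the setup of the cited paper.

```python
# Sketch of the coupled-sample-paths diagnostic: several runs driven by
# one shared stream of random deviates, monitored until the paths agree.
import numpy as np

rng = np.random.default_rng(1)
rho = 0.9                                   # correlation of the bivariate normal target (assumed)
starts = [(-10.0, -10.0), (0.0, 0.0), (10.0, 10.0)]
n_iter = 100

# one shared stream of standard-normal deviates, reused by every chain
shared = rng.standard_normal((n_iter, 2))

paths = []
for x1, x2 in starts:
    path = []
    for t in range(n_iter):
        # Gibbs sweep: x1 | x2, then x2 | x1, driven by the shared deviates
        x1 = rho * x2 + np.sqrt(1 - rho**2) * shared[t, 0]
        x2 = rho * x1 + np.sqrt(1 - rho**2) * shared[t, 1]
        path.append((x1, x2))
    paths.append(np.array(path))

paths = np.stack(paths)                     # shape (n_chains, n_iter, 2)
spread = paths.max(axis=0) - paths.min(axis=0)
coupled_at = np.argmax(spread.max(axis=1) < 1e-6)
print("paths agree to 1e-6 after iteration", coupled_at)
```

Because each Gibbs sweep in this toy target is a contraction of the state and the chains share their innovations, the spread between the paths shrinks geometrically; monitoring that spread is the diagnostic the abstract describes.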
Variational inference for Dirichlet process mixtures
Dirichlet process (DP) mixture models are the cornerstone of nonparametric Bayesian statistics, and the development of Markov chain Monte Carlo (MCMC) sampling methods for DP mixtures has enabled the application of nonparametric Bayesian methods to a variety of practical data analysis problems. However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One cl...
Dirichlet Process Parsimonious Mixtures for clustering
Parsimonious Gaussian mixture models, which exploit an eigenvalue decomposition of the group covariance matrices of the Gaussian mixture, have proven successful, in particular in cluster analysis. They are generally estimated by maximum likelihood, and their estimation has also been considered from a parametric Bayesian perspective. We propose new Dirichlet Process Parsimonious mixtur...
Bayesian Nonparametric Calibration and Combination of Predictive Distributions
We introduce a Bayesian approach to predictive density calibration and combination that accounts for parameter uncertainty and model set incompleteness through the use of random calibration functionals and random combination weights. Building on the work of Ranjan and Gneiting (2010) and Gneiting and Ranjan (2013), we use infinite beta mixtures for the calibration. The proposed...
Publication year: 1998